# Synthetic data distillation
## Phi 4 Mini Reasoning Unsloth Bnb 4bit

Phi-4-mini-reasoning is a lightweight open-source model focused on mathematical reasoning, supporting a context length of 128K tokens and suitable for environments with limited computing resources.

Publisher: unsloth · License: MIT · Tags: Large Language Model, Transformers, Supports Multiple Languages · Downloads: 2,329 · Likes: 5
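A minimal sketch of how a pre-quantized bnb-4bit checkpoint like this could be loaded with the Hugging Face `transformers` library to sample step-by-step reasoning traces, the raw material for synthetic-data distillation. This is not code from the listing: the repo id is an assumption based on the listing name, and the prompt is purely illustrative. Running it also assumes `accelerate` and `bitsandbytes` are installed for the quantized weights.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id, inferred from the listing name above.
model_id = "unsloth/Phi-4-mini-reasoning-unsloth-bnb-4bit"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The checkpoint ships with a bitsandbytes 4-bit quantization config,
# so no extra quantization arguments are needed here.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Illustrative math prompt; a distillation pipeline would iterate over many such prompts.
messages = [{"role": "user", "content": "Solve step by step: what is the sum of the first 50 positive integers?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Sample a chain-of-thought style completion; (prompt, completion) pairs like this
# can then be filtered and used as synthetic training data for a student model.
outputs = model.generate(inputs, max_new_tokens=512, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```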
## Neuralhermes 2.5 Mistral 7B

NeuralHermes is a large language model based on OpenHermes-2.5-Mistral-7B, further fine-tuned through Direct Preference Optimization (DPO), demonstrating excellent performance across multiple benchmarks.

Publisher: mlabonne · License: Apache-2.0 · Tags: Large Language Model, Transformers, English · Downloads: 215 · Likes: 154
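For orientation, a minimal sketch of DPO fine-tuning with the `trl` library, assuming a preference dataset with `prompt`, `chosen`, and `rejected` columns. This is not the exact NeuralHermes recipe; the tiny inline dataset and training settings below are illustrative placeholders.

```python
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

# Base model named in the listing above.
base_model = "teknium/OpenHermes-2.5-Mistral-7B"
model = AutoModelForCausalLM.from_pretrained(base_model)
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token  # Mistral tokenizers ship without a pad token

# Tiny illustrative preference dataset; a real run would use thousands of pairs.
train_dataset = Dataset.from_dict({
    "prompt":   ["What is 2 + 2?"],
    "chosen":   ["2 + 2 = 4."],
    "rejected": ["2 + 2 = 5."],
})

# beta controls how far the policy may drift from the implicit reference model.
args = DPOConfig(output_dir="dpo-out", beta=0.1, per_device_train_batch_size=1, max_steps=10)
trainer = DPOTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    processing_class=tokenizer,  # older trl versions take tokenizer= instead
)
trainer.train()
```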